Training Factored PCFGs with Expectation Propagation

Authors

  • David Leo Wright Hall
  • Dan Klein
Abstract

PCFGs can grow exponentially as additional annotations are added to an initially simple base grammar. We present an approach in which multiple annotations coexist, but in a factored manner that avoids this combinatorial explosion. Our method works with linguistically motivated annotations, induced latent structure, lexicalization, or any mix of the three. We use a structured expectation propagation algorithm that makes use of the factored structure in two ways. First, by partitioning the factors, it speeds up parsing exponentially over the unfactored approach. Second, it minimizes the redundancy of the factors during training, improving accuracy over an independent approach. Using purely latent variable annotations, we can efficiently train and parse with up to 8 latent bits per symbol, achieving F1 scores up to 88.4 on the Penn Treebank while using two orders of magnitude fewer parameters than the naïve approach. Combining latent, lexicalized, and unlexicalized annotations, our best parser achieves 89.4 F1 on all sentences of section 23 of the Penn Treebank.
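To make the scaling claim concrete, here is some back-of-the-envelope arithmetic (my own illustrative numbers, not figures from the paper): with k annotation factors of m values each, fully crossing the annotations multiplies the versions of every binary rule by (m^k)^3, whereas keeping the factors separate costs only k * m^3.

```python
# Illustrative arithmetic only (assumed sizes, not the paper's exact counts).
# With k annotation factors of m values each, a binary rule A -> B C has
# (m**k)**3 fully-annotated versions in the crossed (product) grammar,
# but only k * m**3 across the separate factors: exponential vs. linear in k.
k, m = 8, 2                        # e.g. 8 latent bits per symbol
crossed = (m ** k) ** 3            # 256**3 = 16,777,216 versions per rule
factored = k * m ** 3              # 8 * 8  = 64 versions per rule
print(f"crossed: {crossed:,}  factored: {factored}")
```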


Similar articles

Non-Local Modeling with a Mixture of PCFGs

While most work on parsing with PCFGs has focused on local correlations between tree configurations, we attempt to model non-local correlations using a finite mixture of PCFGs. A mixture grammar fit with the EM algorithm shows improvement over a single PCFG, both in parsing accuracy and in test data likelihood. We argue that this improvement comes from the learning of specialized grammars that ...
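As a rough illustration of the E-step such a mixture requires, here is a minimal sketch (hypothetical names; it assumes each component PCFG's tree likelihood has already been computed, e.g. by the inside algorithm):

```python
import math

def responsibilities(log_ptree, log_weights):
    """Posterior probability that each component PCFG generated one tree.
    log_ptree[i]   = log P_i(tree) under component i (assumed given);
    log_weights[i] = log mixture weight of component i."""
    scores = [lp + lw for lp, lw in zip(log_ptree, log_weights)]
    m = max(scores)
    z = m + math.log(sum(math.exp(s - m) for s in scores))  # log-sum-exp
    return [math.exp(s - z) for s in scores]

# The M-step would then reset each mixture weight to the average
# responsibility of its component over the training trees.
```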


Estimating Probabilities in PCFGs

  • find $\hat{P}(N^j \to \zeta) = \frac{C(N^j \to \zeta)}{\sum_\gamma C(N^j \to \gamma)}$
  • $C(X)$ = count of how often rule $X$ is used
  • no annotation ⇒ no rule counts: a hidden-data problem, similar to Hidden Markov Models
  • start with some initial rule probabilities, parse the training sentences, and use parse probabilities as an indicator of confidence
  • find the expectation of how often each rule is used
  • based on these expectations, maximize the probabilities (sketched below):
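The maximization step named in the last bullet is just the relative-frequency formula above, applied to expected counts. A minimal Python sketch with a hypothetical data layout:

```python
from collections import defaultdict

def m_step(expected_counts):
    """P_hat(N -> zeta) = C(N -> zeta) / sum_gamma C(N -> gamma),
    where C(.) holds the expected rule-usage counts from the E-step.
    expected_counts: {(lhs, rhs): expected count}."""
    lhs_totals = defaultdict(float)
    for (lhs, _rhs), count in expected_counts.items():
        lhs_totals[lhs] += count
    return {(lhs, rhs): count / lhs_totals[lhs]
            for (lhs, rhs), count in expected_counts.items()}
```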


Collapsed Variational Bayesian Inference for PCFGs

This paper presents a collapsed variational Bayesian inference algorithm for PCFGs that has the advantages of the two dominant Bayesian training algorithms for PCFGs, namely variational Bayesian inference and Markov chain Monte Carlo. In three kinds of experiments, we illustrate that our algorithm achieves performance close to that of the Hastings sampling algorithm while using an order of magnitude less t...
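For intuition, collapsed inference for multinomial models typically replaces point estimates with Dirichlet-smoothed expected counts; the sketch below shows that quantity in a common CVB0-style approximation, which is not necessarily this paper's exact update:

```python
from collections import defaultdict

def cvb_rule_prob(exp_counts, alpha):
    """Dirichlet-smoothed rule probabilities from expected counts (a common
    CVB0-style approximation; the paper's exact update may differ).
    exp_counts: {(lhs, rhs): expected count}, assumed to list every
    candidate rule; alpha: symmetric Dirichlet prior."""
    totals, n_rules = defaultdict(float), defaultdict(int)
    for (lhs, _rhs), c in exp_counts.items():
        totals[lhs] += c
        n_rules[lhs] += 1
    return {(lhs, rhs): (c + alpha) / (totals[lhs] + alpha * n_rules[lhs])
            for (lhs, rhs), c in exp_counts.items()}
```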


A Separate-and-Learn Approach to EM Learning of PCFGs

We propose a new approach to EM learning of PCFGs. We completely separate the process of EM learning from that of parsing, and for the former, we introduce a new EM algorithm called the graphical EM algorithm, which runs on a new data structure called support graphs extracted from the WFSTs (well-formed substring tables) of various parsers. Learning experiments with PCFGs using two Japanese corpora in...
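The key object here is the support graph, an AND/OR structure that compactly encodes every derivation found by the parser; EM statistics then come from inside/outside passes over it. A minimal inside-pass sketch under an assumed encoding:

```python
from math import prod

def inside(support_graph, rule_prob, bottom_up_order):
    """Inside probabilities over a support graph (assumed encoding:
    support_graph[node] is a list of explanations (rule, child_nodes);
    terminal nodes have an empty list and score 1)."""
    beta = {}
    for node in bottom_up_order:
        explanations = support_graph[node]
        if not explanations:                 # terminal / fact node
            beta[node] = 1.0
        else:                                # sum over alternative derivations
            beta[node] = sum(rule_prob[rule] * prod(beta[c] for c in kids)
                             for rule, kids in explanations)
    return beta
```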


Parsing with PCFGs

The PCFG model is without doubt the most important formal model in syntactic parsing today, not only because it is widely used in itself but also because many later developments start from it. In this lecture, I will first introduce the basic formalism (§1) and the parsing model that naturally follows from it (§2). I will then give an overview of standard techniques for parsing (§3), for superv...
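The parsing model that follows from a PCFG is usually realized with CKY over a grammar in Chomsky normal form; below is a standard textbook sketch (not the lecture's own code):

```python
def cky(words, lex_rules, bin_rules):
    """Viterbi CKY for a PCFG in Chomsky normal form.
    lex_rules: {word: [(A, p)]} for rules A -> word;
    bin_rules: [(A, B, C, p)] for rules A -> B C.
    Returns a chart: chart[i][j][A] = best probability of A over words[i:j]."""
    n = len(words)
    chart = [[{} for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):                      # width-1 spans
        for A, p in lex_rules.get(w, []):
            chart[i][i + 1][A] = max(chart[i][i + 1].get(A, 0.0), p)
    for width in range(2, n + 1):                      # wider spans
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):                  # split point
                for A, B, C, p in bin_rules:
                    if B in chart[i][k] and C in chart[k][j]:
                        score = p * chart[i][k][B] * chart[k][j][C]
                        if score > chart[i][j].get(A, 0.0):
                            chart[i][j][A] = score
    return chart  # best full parse rooted in S: chart[0][n].get("S", 0.0)
```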



Publication date: 2012